Cryptography based on the Hardness of Decoding
Author
Abstract
Public key cryptography is like magic. It allows two people who have never met before to communicate privately over any public channel. Since its conception in the 1970s, public key cryptography has become indispensable for the modern connected world: it underlies the HTTPS protocol, e-commerce, online auctions, online elections, and much more. Computational hardness is what makes public key cryptography work. However, only a handful of hardness assumptions are known that suffice for the construction of public key cryptosystems. Early public key cryptosystems like RSA and ElGamal are based on number-theoretic problems, such as the factoring problem and the discrete logarithm problem. By their nature, these problems are highly structured. While it is this structure that enables the construction of public key cryptosystems in the first place, it also gives rise to highly non-trivial attacks. As a consequence, all public key cryptosystems based on number-theoretic assumptions can be broken efficiently by quantum computers, and very few candidates resist subexponential-time classical attacks. This has raised concerns about the hardness of these problems. As a promising alternative to number-theoretic hardness assumptions, coding- and lattice-based hardness assumptions have emerged. Most prominent among these are the learning parity with noise (LPN) and learning with errors (LWE) problems. Such problems arise naturally in coding theory and have resisted more than 50 years of algorithmic and cryptanalytic effort, both classical and quantum. This thesis provides progress for both LPN- and LWE-based cryptography. As the main contribution of this thesis, we provide two constructions of adaptively secure public key cryptosystems. Adaptive chosen-ciphertext (IND-CCA2) security is the gold standard of security definitions for public key cryptography.
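The LPN problem mentioned above asks to recover a secret bit vector s from noisy inner products over GF(2). A minimal sketch of how LPN samples are generated (the function name, parameter names, and the example noise rate are our own illustrative choices, not taken from the thesis):

```python
import secrets

def lpn_samples(s, num, tau_num=1, tau_den=8):
    """Return `num` LPN samples (a, <a, s> + e mod 2) for secret bit vector s.

    The error bit e is 1 with probability tau_num/tau_den (the noise rate).
    Illustrative sketch only; recovering s from such samples is the LPN problem.
    """
    n = len(s)
    samples = []
    for _ in range(num):
        a = [secrets.randbelow(2) for _ in range(n)]          # uniform a in {0,1}^n
        e = 1 if secrets.randbelow(tau_den) < tau_num else 0  # Bernoulli(tau) noise
        b = (sum(ai * si for ai, si in zip(a, s)) + e) % 2    # noisy inner product
        samples.append((a, b))
    return samples
```

Without the noise bit e, Gaussian elimination on n samples would reveal s; it is the noise that is conjectured to make the problem hard. A "low noise" variant, as used in the second construction below, simply takes a smaller noise rate.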
IND-CCA2 secure cryptosystems must withstand attacks by an adversary with access to a decryption oracle that decrypts every ciphertext except for a special challenge ciphertext, for which the adversary is tasked with guessing the corresponding plaintext. Our first proposal is based on the McEliece assumption and the LPN problem. The second is based solely on the hardness of a low-noise LPN problem; this construction was the first of its kind and answered a problem that had been open for nine years. The second contribution of this thesis concerns the LWE problem. The most important feature of the LWE problem is its worst-case hardness guarantee. Simply put, this means that almost all instances of the problem are as hard as the hardest instance of the problem. Such a feature is highly desirable in cryptography, as it guarantees that a cryptosystem based on this problem has essentially no weak keys or weak ciphertexts. This worst-case guarantee, however, is established using a worst-case to average-case reduction.
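The IND-CCA2 experiment described above can be written down as a short game skeleton. This is an illustrative sketch only, with a deliberately insecure toy "scheme" standing in for a real cryptosystem; none of the names or interfaces are taken from the thesis:

```python
import secrets

class ToyScheme:
    """Insecure stand-in scheme (XOR with the key) just to run the game;
    a real public key construction would slot in here instead."""
    def keygen(self):
        k = secrets.randbelow(256)
        return k, k  # pk == sk here, so this is obviously NOT a real public-key scheme
    def encrypt(self, pk, m):
        return m ^ pk
    def decrypt(self, sk, ct):
        return ct ^ sk

class INDCCA2Game:
    """Skeleton of the adaptive chosen-ciphertext (IND-CCA2) experiment."""
    def __init__(self, scheme):
        self.scheme = scheme
        self.pk, self.sk = scheme.keygen()
        self.challenge_ct = None

    def decryption_oracle(self, ct):
        # The oracle decrypts every ciphertext EXCEPT the challenge ciphertext.
        if ct == self.challenge_ct:
            raise ValueError("may not query the challenge ciphertext")
        return self.scheme.decrypt(self.sk, ct)

    def challenge(self, m0, m1):
        # Encrypt one of the two adversary-chosen messages, picked at random.
        self.b = secrets.randbelow(2)
        self.challenge_ct = self.scheme.encrypt(self.pk, (m0, m1)[self.b])
        return self.challenge_ct

    def finalize(self, guess):
        # The adversary wins if it guesses which message was encrypted.
        return guess == self.b
```

A scheme is IND-CCA2 secure if no efficient adversary, even with the oracle, can win this game with probability noticeably better than 1/2; the toy XOR scheme above fails this immediately, which is exactly why it is only a stand-in.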
Similar Articles
A Novel Patch-Based Digital Signature
In this paper, a new patch-based digital signature (DS) scheme is proposed. Similar to steganography methods, the proposed approach hides the secret message in a host image. However, it uses a patch-based key to encode/decode the data, as in cryptographic approaches. Both the host image and the key patches are randomly initialized. The proposed approach consists of encoding and decoding algorithms. The encodin...
On the Hardnesses of Several Quantum Decoding Problems
We classify the time complexities of three important decoding problems for quantum stabilizer codes. First, regardless of the channel model, quantum bounded distance decoding is shown to be NP-hard, just as Berlekamp, McEliece, and van Tilborg showed for classical binary linear codes in 1978. Then, over the depolarizing channel, the decoding problems for finding a most likely error and for minimizing ...
On Bounded Distance Decoding, Unique Shortest Vectors, and the Minimum Distance Problem
We prove the equivalence, up to a small polynomial approximation factor √(n/log n), of the lattice problems uSVP (unique Shortest Vector Problem), BDD (Bounded Distance Decoding), and GapSVP (the decision version of the Shortest Vector Problem). This resolves a long-standing open problem about the relationship between uSVP and the more standard GapSVP, as well as the BDD problem commonly used in co...
Syndrome Decoding in the Non-Standard Cases
In the late 1970s, the McEliece cryptosystem was invented and the syndrome decoding problem was proven to be NP-complete. The proof of NP-completeness shows that among certain instances (those which can be derived from a three-dimensional matching problem), some are difficult to solve. The fact that no attack has yet been found on the McEliece cryptosystem tends to show that for standard parameters (like those used...
Repairing the Faure-Loidreau Public-Key Cryptosystem
A repair of the Faure–Loidreau (FL) public-key code-based cryptosystem is proposed. The FL cryptosystem is based on the hardness of list decoding Gabidulin codes which are special rank-metric codes. We prove that the structural attack on the system by Gaborit et al. is equivalent to decoding an interleaved Gabidulin code. Since all known polynomial-time decoders for these codes fail for a large...